
    Security assessment of the smart grid: a review focusing on the NAN architecture

    Abstract: This paper presents a comprehensive review of the security aspects of the smart grid communication network. The paper focuses on Neighborhood Area Network (NAN) cybersecurity and emphasizes why the NAN architecture is such an attractive target for intruders and attackers. The paper aims to summarize recent research efforts on some of these attacks and the various techniques employed to tackle them, as discussed in recent literature. Furthermore, the paper presents a detailed review of the smart grid communication layers, wireless technology standards, networks, and the security challenges the grid currently faces. The work concludes by outlining current and future directions NAN communication security could take in terms of data privacy measures, which are discussed as prevention and detection techniques.

    Smart home appliances scheduling to manage energy usage

    Abstract: It is imperative to manage household appliances in a cost-effective way to realize efficient energy utilization, reduce spending on electricity bills and increase grid reliability. This study presents a Home Energy Management System (HEMS) scheduling analysis. The scheduling plan avoids the electricity wastage that arises largely from residents' negligence in controlling appliances. The home appliances employed in the research work are classified in terms of their operating periods, and energy consumption was evaluated using Fixed Pricing (FP) data. The appliance scheduling plan was developed on the Microsoft .NET Framework in C#, whereas the front end showing the scheduled operating periods for the appliances was built with the Telerik UI framework for Windows Forms. Simulation results of the scheduling plan show that energy consumption in homes can be planned, monitored and controlled to avoid energy wastage and minimize energy expenditure.
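    As a rough illustration of the kind of scheduling logic this abstract describes (the study itself was implemented in C# on the .NET Framework), the following Python sketch places hypothetical appliances inside allowed operating windows and totals the cost under an assumed flat tariff. All appliance names, power ratings, windows and the tariff value are illustrative assumptions, not figures from the study.

    # Minimal sketch of fixed-price appliance scheduling (illustrative only).
    # Appliance data, operating windows, and the tariff are assumed values,
    # not parameters taken from the paper.

    FIXED_PRICE = 0.15  # assumed flat tariff, currency units per kWh

    # Each appliance: rated power (kW), required run time (h), allowed window (start, end hour).
    APPLIANCES = {
        "washing_machine": {"power_kw": 0.8, "run_hours": 2, "window": (8, 18)},
        "water_heater":    {"power_kw": 2.0, "run_hours": 1, "window": (5, 9)},
        "dishwasher":      {"power_kw": 1.2, "run_hours": 1, "window": (20, 23)},
    }

    def schedule(appliances):
        """Place each appliance at the start of its allowed window and cost it."""
        plan, total_cost = {}, 0.0
        for name, spec in appliances.items():
            start, end = spec["window"]
            run = spec["run_hours"]
            if start + run > end:
                raise ValueError(f"{name}: run time does not fit its window")
            cost = spec["power_kw"] * run * FIXED_PRICE
            plan[name] = {"start_hour": start, "end_hour": start + run, "cost": round(cost, 2)}
            total_cost += cost
        return plan, round(total_cost, 2)

    if __name__ == "__main__":
        plan, cost = schedule(APPLIANCES)
        for name, slot in plan.items():
            print(name, slot)
        print("Total daily cost:", cost)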

    Real time security assessment of the power system using a hybrid support vector machine and multilayer perceptron neural network algorithms

    Abstract: In today’s grid, technology-based cyber-physical systems continue to be plagued by cyberattacks and intrusions. Any intrusive action on the power system’s Optimal Power Flow (OPF) modules can cause a series of operational instabilities, failures, and financial losses. Real-time intrusion detection has become a major challenge for the power community and energy stakeholders, and current conventional methods continue to exhibit shortfalls in tackling these security issues. To address this, the paper proposes a hybrid Support Vector Machine and Multilayer Perceptron Neural Network (SVMNN) algorithm that combines Support Vector Machine (SVM) and multilayer perceptron neural network (MPLNN) algorithms for predicting and detecting cyber intrusion attacks on power system networks. A modified version of the IEEE Garver 6-bus test system and a 24-bus system were used as case studies. The IEEE Garver 6-bus test system was used to describe the attack scenarios, whereas load flow analysis was conducted on real-time data of a modified Nigerian 24-bus system to generate the bus voltage dataset, which considered several cyberattack events, for the hybrid algorithm. Various performance metrics and load/generator injections are included in the manuscript; simulation results showed the relevant influences of cyberattacks on power systems in terms of voltage, power, and current flows. To demonstrate the performance of the proposed hybrid SVMNN algorithm, the results are compared with other models in related studies. The hybrid algorithm achieved a detection accuracy of 99.6%, which is better than recently proposed schemes.
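    The abstract does not spell out how the SVM and MLP outputs are fused, so the sketch below simply combines the two classifiers with soft voting over a synthetic stand-in for a bus-voltage dataset. The fusion rule, hyperparameters and data shapes are assumptions for illustration only, not the paper's configuration.

    # Illustrative sketch of a hybrid SVM + MLP intrusion detector.
    # The soft-voting fusion and all hyperparameters are assumptions; the
    # paper's exact combination scheme and dataset are not reproduced here.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.neural_network import MLPClassifier
    from sklearn.ensemble import VotingClassifier
    from sklearn.metrics import accuracy_score

    # Stand-in for a labelled bus-voltage dataset (normal vs. attacked operating points).
    X, y = make_classification(n_samples=2000, n_features=24, n_informative=12, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True, C=10.0))
    mlp = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0))

    # Soft voting averages the two probability estimates (assumed fusion rule).
    hybrid = VotingClassifier(estimators=[("svm", svm), ("mlp", mlp)], voting="soft")
    hybrid.fit(X_train, y_train)

    print("Hybrid accuracy:", accuracy_score(y_test, hybrid.predict(X_test)))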

    Power system events classification using genetic algorithm based feature weighting technique for support vector machine

    Abstract: Currently, ensuring that power systems operate efficiently in stable and secure conditions has become a key challenge worldwide. Various unwanted events, including injections and faults, especially within the generation and transmission domains, are major causes of these instabilities. The earlier operators can identify and accurately diagnose these unwanted events, the faster they can react and execute timely corrective measures to prevent large-scale blackouts and avoidable loss of lives and equipment. This paper presents a hybrid classification technique using a support vector machine (SVM) with an evolutionary genetic algorithm (GA) model to detect and classify power system unwanted events in an accurate yet straightforward manner. In the proposed approach, the features of two large-dimensional synchrophasor datasets are first reduced using principal component analysis, then weighted by relevance, and the dominant weights are heuristically identified using the genetic algorithm to boost classification results. The weighted, dominant features selected by the GA are then used to train the modelled linear SVM and radial basis function kernel SVM to classify unwanted events. The performance of the proposed GA-SVM model was evaluated and compared with other models using key classification metrics. The high classification results validate the proposed method. The experimental results indicate that the proposed model can achieve an overall improvement in the classification rate of unwanted events in power systems, and show that applying the GA as the feature weighting tool offers a significant improvement in classification performance.
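    As a rough sketch of the pipeline described above (PCA reduction, GA-searched feature weights, then SVM classification), the code below evolves a weight vector over the PCA components with a small hand-written genetic algorithm and trains a linear SVM on the weighted features. The GA settings, fitness definition and synthetic data are assumptions and do not reproduce the paper's setup.

    # Sketch of GA-based feature weighting for an SVM classifier.
    # Population size, mutation rate, fitness definition, and the synthetic
    # data are assumptions; they do not reproduce the paper's configuration.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Stand-in for a high-dimensional synchrophasor dataset.
    X, y = make_classification(n_samples=1500, n_features=60, n_informative=15, random_state=0)
    X_pca = PCA(n_components=10, random_state=0).fit_transform(X)
    X_tr, X_val, y_tr, y_val = train_test_split(X_pca, y, test_size=0.3, random_state=0)

    def fitness(weights):
        """Validation accuracy of a linear SVM trained on weighted PCA features."""
        clf = SVC(kernel="linear").fit(X_tr * weights, y_tr)
        return clf.score(X_val * weights, y_val)

    # Simple generational GA over weight vectors in [0, 1].
    pop = rng.random((20, X_pca.shape[1]))
    for generation in range(15):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[-10:]]                   # keep the fittest half
        children = []
        for _ in range(len(pop) - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(a.shape) < 0.5, a, b)     # uniform crossover
            child += rng.normal(0, 0.1, size=child.shape)         # Gaussian mutation
            children.append(np.clip(child, 0.0, 1.0))
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(w) for w in pop])]
    print("Best validation accuracy:", fitness(best))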

    A review of research works on supervised learning algorithms for SCADA intrusion detection and classification

    Abstract: Supervisory Control and Data Acquisition (SCADA) systems play a significant role in providing remote access, monitoring and control of critical infrastructures (CIs), which include electrical power systems, water distribution systems, nuclear power plants, etc. The growing interconnectivity, standardization of communication protocols and remote accessibility of modern SCADA systems have contributed massively to the exposure of SCADA systems and CIs to various forms of security challenges. Any form of intrusive action on the SCADA modules and communication networks can create devastating consequences for nations due to their strategic importance to CIs’ operations. Therefore, the prompt and efficient detection and classification of SCADA system intrusions hold great importance for national CI operational stability. Owing to their well-recognized and documented efficiency, numerous supervised learning techniques have been proposed in the literature for SCADA intrusion detection and classification (IDC). This paper presents a critical review of recent studies in which supervised learning techniques were modelled as SCADA intrusion solutions. The paper aims to contribute to the state of the art, recognize critical open issues and offer ideas for future studies, with the intention of providing a research-based resource for researchers working on industrial control system security. Different supervised learning techniques for SCADA IDC systems were analysed, compared and critically reviewed in terms of the methodologies, datasets and testbeds used, feature engineering and optimization mechanisms, and classification procedures. Finally, we briefly summarize some suggestions and recommendations for future research works.

    Using call admission control and call duration control for mobile network congestion management

    M.Tech. (Electrical Engineering)
    Abstract: Wireless communications have experienced remarkable development in the past decade. Networks such as the global system for mobile communications (GSM) and the universal mobile telecommunication service (UMTS) have enjoyed enormous patronage, leading to massive mobile network congestion. Network congestion is a network management issue that affects the Quality of Service (QoS) rendered by a network. Hence, for the sustainability of the system, there is a need to fully manage radio resources during peak and off-peak periods. Call Admission Control (CAC) schemes are used extensively in managing mobile network congestion. CAC is an approach that can provide credible QoS by regulating the number of connections into the cellular network, thereby allowing good use of the radio resources and reducing network congestion, interference and other QoS problems. Network resources cannot be available to all users at all times, especially during busy-hour traffic. Hence, these resources require effective and efficient allocation so that more subscribers are allowed to use the network irrespective of the network traffic at a particular reference time. Mobile network congestion can be controlled by suppressing the mobile network traffic demand. The traffic demand varies proportionally with the average rate of call arrivals and the average duration of the calls; hence, limiting the average call duration will reduce the traffic load and thereby reduce network congestion. Some network users, owing to affluence or business demands, hold on to a particular channel for long call durations to the detriment of other network users, and a CAC scheme alone cannot solve the congestion caused by this set of users. Here, the combination of a channel-reservation CAC scheme with a Call Duration Control (CDC) scheme is proposed, in which users who have stayed in the network beyond a predetermined period are served a termination notice so as to make the channel available for new users. The motive is to free the available channels to accommodate more users. A simulation-based approach was used to model the combination of the two schemes, and the combination produced good results in reducing congestion. The network type deployed in the research is GSM. The results of the combined CAC/CDC scheme were compared with those of the ordinary CAC scheme in order to verify the impact of the…
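    To illustrate the combined CAC/CDC idea described above, the sketch below runs a crude Monte Carlo simulation of a single GSM cell: new calls are blocked when free channels fall below a reserved threshold, and any call exceeding a maximum holding time is terminated to free its channel. All traffic parameters, thresholds and the simulation structure are assumptions for illustration, not the thesis model.

    # Crude single-cell simulation of CAC with channel reservation plus
    # Call Duration Control (forced termination after a holding-time cap).
    # Arrival rate, call durations, channel counts, and thresholds are assumed values.
    import random

    random.seed(1)

    TOTAL_CHANNELS = 30
    RESERVED = 3            # channels kept free by the CAC scheme (e.g., for priority traffic)
    MAX_DURATION = 180.0    # CDC cap on call holding time, in seconds
    MEAN_DURATION = 120.0   # mean exponential call duration
    ARRIVAL_RATE = 0.25     # new-call arrivals per second
    SIM_TIME = 3600.0       # one simulated hour

    def simulate():
        time, active, blocked, admitted, terminated = 0.0, [], 0, 0, 0
        while time < SIM_TIME:
            time += random.expovariate(ARRIVAL_RATE)        # next call arrival
            # Release finished calls; CDC forcibly ends calls at MAX_DURATION.
            still_active = []
            for start, duration in active:
                end = start + min(duration, MAX_DURATION)
                if end <= time:
                    if duration > MAX_DURATION:
                        terminated += 1
                else:
                    still_active.append((start, duration))
            active = still_active
            # CAC admission test: keep RESERVED channels free.
            if TOTAL_CHANNELS - len(active) > RESERVED:
                active.append((time, random.expovariate(1.0 / MEAN_DURATION)))
                admitted += 1
            else:
                blocked += 1
        return admitted, blocked, terminated

    admitted, blocked, terminated = simulate()
    print(f"admitted={admitted} blocked={blocked} CDC-terminated={terminated}")
    print(f"blocking probability ~ {blocked / (admitted + blocked):.3f}")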

    Refined LSTM Based Intrusion Detection for Denial-of-Service Attack in Internet of Things

    The Internet of Things (IoT) is a promising technology that allows numerous devices to be connected for ease of communication. The heterogeneity and ubiquity of the various connected devices, openness to devices in the network, and, importantly, the increasing number of connected smart objects (or devices) have exposed the IoT network to various security challenges and vulnerabilities, including manipulative data injection and cyberattacks such as denial-of-service (DoS) attacks. Any form of intrusive data injection or attack on an IoT network can create devastating consequences for an individual connected device or the entire network. Hence, there is a crucial need to employ modern security measures that can protect the network from various forms of attacks and other security challenges. Intrusion detection systems (IDS) and intrusion prevention systems have been identified globally as viable security solutions. Several traditional machine learning methods have been deployed as IoT IDS. However, these methods have been heavily criticized for poor performance on voluminous datasets and for their reliance on domain expertise for feature extraction, among other reasons. Thus, there is a need to devise better IDS models that can handle voluminous IoT datasets efficiently, cater to feature extraction, and perform reasonably well overall. In this paper, an IDS based on a refined long short-term memory (LSTM) deep learning approach is proposed for detecting DoS attacks in IoT networks. The model was tested on the CICIDS-2017 and NSL-KDD benchmark datasets. Three pre-processing procedures, namely encoding, dimensionality reduction, and normalization, were applied to the datasets. Using key classification metrics, the experimental results show that the proposed model can effectively detect DoS attacks in IoT networks, as it performs better than other methods, including models from related works.
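    A minimal sketch of the kind of pipeline this abstract outlines (encoding, dimensionality reduction, normalization, then an LSTM classifier) is shown below using TensorFlow/Keras on placeholder data. The framework choice, layer sizes, PCA component count and synthetic records are assumptions and do not reproduce the paper's refined model or its CICIDS-2017/NSL-KDD experiments.

    # Sketch of an LSTM-based DoS intrusion detector with the three pre-processing
    # steps named in the abstract: encoding, dimensionality reduction, normalization.
    # The placeholder data, PCA size, and network architecture are assumptions.
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import LabelEncoder, MinMaxScaler
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    import tensorflow as tf

    # Placeholder flow records standing in for CICIDS-2017 / NSL-KDD style features.
    rng = np.random.default_rng(0)
    df = pd.DataFrame(rng.random((3000, 20)), columns=[f"f{i}" for i in range(20)])
    df["protocol"] = rng.choice(["tcp", "udp", "icmp"], size=len(df))
    labels = rng.integers(0, 2, size=len(df))           # 0 = benign, 1 = DoS

    # 1) Encode categorical fields, 2) reduce dimensionality with PCA, 3) min-max normalize.
    df["protocol"] = LabelEncoder().fit_transform(df["protocol"])
    X = PCA(n_components=10).fit_transform(df.values)
    X = MinMaxScaler().fit_transform(X)
    X = X.reshape((X.shape[0], 1, X.shape[1]))          # (samples, timesteps, features)

    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(1, X.shape[2])),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X_tr, y_tr, epochs=3, batch_size=64, verbose=0)
    print("Test accuracy:", model.evaluate(X_te, y_te, verbose=0)[1])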
